Computing Substrates and Life
Abstract
Living matter distinguishes itself from inanimate matter by actively maintaining a high degree of inhomogeneous organisation. Information processing is quintessential to this capability. The present paper inquires into the degree to which the information processing aspect of living systems can be abstracted from the physical medium of its implementation. Information processing that serves to sustain the complex organisation of a living system faces both the harsh reality of real-time requirements and severe constraints on the energy and material that can be expended on the task. This issue is of interest for the potential scope of Artificial Life and its interaction with Synthetic Biology. It is pertinent also for information technology. With regard to the latter aspect, the use of a living cell in a robot control architecture is considered.

1 Life as information process

The difference between the living and the non-living is not, as once supposed, a material difference or a difference in the applicable laws of nature [1]. Rather, life appears to be delimited only by a peculiar organisation of the very same matter that forms the remaining non-living universe. This organisation can be sustained only by active maintenance, which in turn necessitates the processing of information. As a consequence, life without computation is inconceivable. This is true down to the simplest organisms and even their molecular constituents. The macromolecules that underlie the structure and function of living systems are not simply products of chemical reactions; they are individually assembled in sophisticated and tightly controlled production processes with fine-grained quality control mechanisms, all of which require computation. Endowed with this essential information processing capability, organisms recruited and extended it for other tasks, most prominently for acquiring nutrients, avoiding hazards, and reproducing. But if one attempts the implementation of life-like artificial devices, the discrepancy between formal computation of practicable complexity and the real-time requirements of an open physical world becomes all too apparent. The question arises whether the intertwining of information processing and material processes innate to organisms may confer computational capabilities that in practice surpass conventional computing methods.

1.1 Abstract computation

Information and operations on it can be described independently of any physical implementation. This approach has turned out to be quite fruitful. Hartley derived the familiar measure of information from symbol frequencies and thus deliberately removed the physical information carrier from the picture [2]. Similarly, the formalisation of a human computer by Turing [3] enabled the study of computation independent of an actual realisation. Turing formulated his abstraction to show that problems exist that cannot be solved by a computer, even if no constraints are placed on available storage space and the time it takes to arrive at a result. His abstraction also entailed a simple but general machine model of computation, which subsequently turned out to be equivalent to several other formalisations of computation (in particular to recursive functions) and was highly influential in the field of theoretical computer science. It is now generally believed that anything that can in principle be computed can also be computed on Turing's machine [4].
In fact, the mathematical notion of what is "effectively computable" now refers to the class of problems that can be solved by Turing's machine model. Such machines, capable of carrying out any computation, are called universal. It is worth emphasising that a universal machine is always a hypothetical construct: a machine restricted to a finite memory cannot be universal. Interestingly, a machine does not require a sophisticated mechanism or a complex architecture to be universal. A processor with an instruction set of only the two commands 'increment' and 'decrement with conditional jump', coupled to an unlimited random access memory, would be universal in the above sense. Or, closer to Turing's original formulation, a finite state automaton with only seven states and the capability to manipulate symbols in a sequential access memory is sufficient for universality [4]. Accordingly, from the standpoint of what is in principle computable, many information processing systems have essentially the same power. No matter how much more complex the architecture of an information processor is compared to the aforementioned simple universal machines, it will not be able to perform a computation that could not also be performed by any of the simple universal machines.
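To make the two-instruction model concrete, the following minimal sketch interprets such a counter-machine program (a Minsky-style counter machine) in Python; the instruction encoding and the example program are our own illustrative choices, not taken from the paper. With registers holding unbounded non-negative integers, machines of exactly this kind are universal.

```python
# A minimal sketch of the two-instruction machine described above
# (a Minsky-style counter machine). The encoding is illustrative.

def run(program, registers):
    """Execute a counter-machine program.

    Each instruction is either
      ("INC", r, nxt)       : increment register r, go to instruction nxt
      ("DECJZ", r, nxt, z)  : if register r is zero go to z,
                              otherwise decrement r and go to nxt.
    Execution halts when the instruction pointer leaves the program.
    """
    ip = 0
    while 0 <= ip < len(program):
        op = program[ip]
        if op[0] == "INC":
            _, r, nxt = op
            registers[r] += 1
            ip = nxt
        else:  # "DECJZ"
            _, r, nxt, z = op
            if registers[r] == 0:
                ip = z
            else:
                registers[r] -= 1
                ip = nxt
    return registers

# Example: add register 1 into register 0 (r0 := r0 + r1).
add = [
    ("DECJZ", 1, 1, 2),  # 0: if r1 == 0 halt, else r1 -= 1
    ("INC", 0, 0),       # 1: r0 += 1, loop back to 0
]
print(run(add, [3, 4]))  # -> [7, 0]
```

Even this toy program illustrates the point made above: the machine's power lies not in its instruction set but in the unbounded memory it is assumed to command.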
After what has been stated regarding the equivalence of information processors, it is reasonable to suppose that the computation requisite for life falls within the realm of the universal machines. It should therefore in principle be possible to abstract the information processing aspect of living matter by means of a universal computational model. This would seem to indicate that the information processing aspect of living systems can be satisfactorily captured by a formal implementation, for instance, on a general purpose digital computer. As a matter of fact, however, attempts to replicate the essence of life in abstracted information processes have not resulted in convincing demonstrations of life-like phenomena. Given that biology hardly ever takes explicit heed of the role information processing plays in the alive state of matter, a partial explanation may be that the processes themselves are not sufficiently understood to be formalised. Another explanation, however, arises from the possibility that the physical substrate that implements the computation is of greater relevance than the abstraction outlined above would permit. It is the latter possibility on which we will focus for the remainder of this paper.

1.2 Real computation

The previous section delineated a picture in which all reasonably complete computing machines have the same ultimate theoretical capability. This finding hinges on ignoring requirements in memory space and execution time. The picture changes radically if not hypothetical formal machines but practical devices are concerned. Then computability is constrained by physical dynamics and realistic resource limitations. It is possible to estimate the ultimate limits of physically feasible computing from the speed of light as the limit for signal propagation and from the constraint Heisenberg's uncertainty principle places on discerning system states [5]. Of more immediate interest are the limitations that arise from the need of any real computation process to represent information by physical degrees of freedom. Accordingly, over the course of a computation, every change in abstract information must be accompanied by a corresponding change in the physical state of the hardware that implements the computation. The course of computation restricts at each stage the permissible physical states of the hardware to a subset of its possible states, with concomitant thermodynamic constraints. As a result of these thermodynamic effects, energy has to be expended to process information; in particular, there is a fundamental minimum of energy required for state preparation [6]. Careful analysis has revealed that energy consumption, reliability, and computing speed can be traded against each other [7, 8]. The energy cost of logic operations can in principle be made arbitrarily small at the expense of speed. However, if a device is required to actually complete a computation, its speed cannot be arbitrarily low, because the physical structure of the computing device itself degrades over time; there is thus a minimum speed required to finish the computation before the hardware decays [9]. Consequently, there is also a minimum energy necessary for driving the computation towards completion.
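The scale of this fundamental minimum can be made concrete with Landauer's well-known bound: irreversibly erasing one bit of information, as happens during state preparation, dissipates at least kT ln 2 of energy. A short worked evaluation at room temperature (our own arithmetic, included for orientation):

```latex
% Landauer's bound: minimum dissipation for the irreversible erasure
% of one bit at absolute temperature T (k is Boltzmann's constant).
\[
  E_{\min} = k T \ln 2
\]
% At room temperature, T = 300 K:
\[
  E_{\min} \approx 1.38 \times 10^{-23}\,\mathrm{J\,K^{-1}}
            \times 300\,\mathrm{K} \times 0.693
          \approx 2.9 \times 10^{-21}\,\mathrm{J}.
\]
```

Per operation this is minute, but a system that must reset vast numbers of degrees of freedom at a pace dictated from outside cannot escape the aggregate cost.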
The above considerations pertain to a computer prepared for a computation or for repeating a computational cycle; the state preparation can also be viewed as resetting the machine. Hanson rightly draws attention to the fact that a robot interacting continuously with a changing environment faces additional constraints on the trade-off between energy, reliability, and speed [10]. The minimal speed of computation needs to be adequate to respond in real time to the challenges posed by the environment; moreover, interaction with an unpredictably changing environment reduces reversibility and incurs an additional state-preparation cost for deleting outdated information. Living systems, as previously stated, require computation not only to interact with their environment but also, more fundamentally, to actively maintain the intricate structure corresponding to the living state. To avoid thermodynamic equilibration, a living system has to continuously process the insults imposed on its organisation by the external and internal environment and take rather detailed control of its microstate in order to pilot it within the set of states compatible with life [11, pp. 14–32]. The need for circumventing states incompatible with life places a lower bound on the speed and reliability of the computation necessary to dissipate perturbations. Owing to this, computational efficiency is important to a living system and, arguably, could even be a limiting factor for its complexity. If we adopt the above perspective and the importance it assigns to information processing for the living state, it is worthwhile to inquire into the principles of natural information processing architectures.

2 The role of the physical substrate

A computer is a system that starts from a state encoding specific information and follows the laws of nature to arrive in a state that can be interpreted as information derived from the starting state (cf. [12]). This general definition encompasses any physical system as an extreme case, because any system could be viewed as computing its own behaviour. Of course, this is a trivial form of information processing, as it eliminates any freedom with regard to the representation of information and its processing. This form of processing is very limited and highly specialised. But it is also highly efficient with regard to the amount of matter and time required. The other extreme within the above definition is occupied by the conventional computer. In this case, the physical representation of information is dissociated from the course of computation that maps the initial state into the result state. The mapping is formally prescribed and arbitrary with regard to the physical interactions that put it into action. It follows that the physical substrate used to implement the formalism is largely irrelevant in this extreme case. In such a system the state evolution is contrived by high energy barriers and, often, averaging. As a consequence it is flexible but inefficient with respect to speed and required material. Between these extremes lies a continuum of possible information processing mechanisms that trade generality for efficiency. It is clear that in living systems the representation of information is more closely linked to the physical interaction of the representing structures than in a conventional computer. Take, for example, the case of gene regulation. The control mechanism is not represented in terms of state transitions, as it would be in a computer program. Instead, the structure of subcomponents is represented, and behaviour emerges from the interactions of the subcomponents, i.e., RNA and proteins. The mapping from a DNA sequence to a specific protein structure or a particular RNA secondary structure is largely independent of that structure. The genetic code serves as an abstract representation. In the case of proteins, the separation between processing and representation is facilitated by the essentially arbitrary mapping from codons to amino acids; in the case of functional RNA it is provided by the multitude of sequences that can give rise to a particular secondary structure. The behaviour of the control scheme thus represented by DNA, however, is a direct consequence of the physical interactions of the subcomponents.
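As a minimal illustration of this separation between representation and processing, the following sketch (our own, not from the paper) treats the genetic code as the arbitrary lookup table it is; only a fragment of the 64-entry standard codon table is included. Synonymous codons let two different DNA sequences represent one and the same protein, while nothing in the table determines how the resulting protein physically behaves.

```python
# The codon -> amino acid mapping is an arbitrary symbolic table
# (a fragment of the standard genetic code; the full table has 64 entries).

CODON_TABLE = {
    "ATG": "M",                                      # methionine (start)
    "GGT": "G", "GGC": "G", "GGA": "G", "GGG": "G",  # glycine
    "GAA": "E", "GAG": "E",                          # glutamate
    "AAA": "K", "AAG": "K",                          # lysine
    "TAA": "*", "TAG": "*", "TGA": "*",              # stop
}

def translate(dna):
    """Map a DNA coding sequence to its amino acid string."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE[dna[i:i + 3]]
        if aa == "*":  # stop codon: end of the protein
            break
        protein.append(aa)
    return "".join(protein)

# Synonymous codons: two distinct sequences, one and the same protein.
print(translate("ATGGGTGAATAA"))  # -> MGE
print(translate("ATGGGAGAGTGA"))  # -> MGE
```

The table captures everything that is formal about gene expression; the folding and the interactions of the translated protein are left entirely to physics.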
It has been stated that "the matter that makes up living systems obeys the laws of physics in ways that are expensive to simulate computationally" [13, p. 411]. Conrad offered the conjecture "that it is impossible to simulate [such a biomolecular information processing system] by a machine to which we can communicate algorithms [...] without distorting its rate of operation or the amount of hardware which it requires" [14, p. 227]. The difficulties encountered in attempts to use conventional information technology for the implementation of life-like responses bear out this position. The need of living systems to process information at a rate sufficient to maintain their material structure ('hardware') within an idiosyncratic set of microstates places special requirements on the substrates suitable to sustain life. This points to a more prominent role for the physical substrate in information processing architectures. Particularly for architectures that ideally would possess life-like features, such as robust real-time behaviour in a complex environment, adaptability in paradoxical and ambiguous situations, self-reconfiguration, or self-repair, the choice of substrate is likely to be critical. The physical substrate is certainly also critical to the power of evolutionary processes. It is hardly possible that a process as simple as mere reproduction, variation, and selection could yield systems of sophisticated complexity unless the substrate on which the process acts is amenable to complexification through evolution [15]. The difficulties in demonstrating emergent phenomena in simulated evolution are perhaps due to a problem with the substrate rather than with the process [16]. Let us next proceed to an attempt at integrating living matter into a bio-hybrid architecture in order to, in the long term, endow a robot with some of the capabilities that are not readily accessible to information processing based on a conventional semiconductor substrate.

3 A practical approach

3.1 The information processing of Physarum polycephalum

The slime mold Physarum polycephalum of the phylum Myxomycota can be found on decaying wood in warm, humid forests. Its life cycle includes a stage in which the organism comprises a single protoplast containing numerous nuclei. This single cell, termed 'plasmodium', moves in an amoeboid-like fashion and feeds on bacteria and other organic matter. It can easily be grown on a moist agar surface (Fig. 1).

Figure 1: The slime mold Physarum polycephalum in the diploid plasmodial state, the most prominent state of its life cycle. The picture shows a section of a Petri dish with 1–2% nutrition-free agar. The plasmodium is a single multi-nuclear cell and distributes material within the cell body through tubular structures (T) which are finely ramified at the growth front (G). Oat flakes (O) are supplied to feed the mold.

Under suitable conditions a plasmodium, which starts out with a diameter of a few tens of micrometers, can grow into a giant flat cell exceeding one meter in diameter and harbouring thousands of millions of nuclei. The behaviour of the plasmodium is size-invariant. The plasmodium acts as a single integrated organism controlled by a decentralised form of information processing. It is found, for instance, that the plasmodium moves towards food sources or away from repellents as a whole cell [17]. Observations have also shown that the plasmodium can find a path through a labyrinth [18]. While in cells of micrometer size signal distribution may be facilitated by diffusion of messenger molecules, the enormous size to which plasmodia can grow necessitates an active communication infrastructure. Since the plasmodium is a single cell, this infrastructure can of course not take the form of a neuronal network. Apparently, information is transmitted and processed in plasmodia of P. polycephalum through the interaction of local oscillations that also give rise to periodic contractions and expansions of the plasmodium. These spatially synchronised oscillations can be observed in every region of the cell body. If white light, which acts as a repellent, shines on a local part of a P. polycephalum cell, the oscillation frequency at the stimulated location decreases and desynchronises from the globally synchronised state [19, 20]. The desynchronisation brings about a phase difference between the oscillating rhythm at the stimulated location and the oscillations in the remaining parts of the cell. The phase difference propagates to other parts of the cell body through protoplasmic streaming, eventually affects global behaviour, and results, for instance, in the escape of the organism from the lit zone. This form of information processing in plasmodia has been modelled with systems of coupled non-linear oscillators [21, 22]; a minimal sketch of such a model is given below.
It also inspired the control scheme for a highly modular robot body with a morphological plasticity that resembles the shape change of a plasmodium [23].
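To indicate what such oscillator models look like, the following minimal sketch simulates a chain of Kuramoto-style phase oscillators in which 'light' lowers the intrinsic frequency of one end of the chain; the parameter values and the nearest-neighbour topology are illustrative assumptions of ours, not taken from [21, 22].

```python
# A minimal sketch in the spirit of the coupled-oscillator models of
# plasmodial information processing; all parameters are illustrative.
import math

N = 20                      # oscillators along a 1-D chain of cell regions
K = 0.8                     # coupling strength between neighbouring regions
DT = 0.01                   # Euler integration time step (s)
STEPS = 20000               # simulate 200 s
OMEGA = 2 * math.pi * 0.5   # intrinsic frequency (0.5 Hz, illustrative)

# White light on regions 0..4 lowers the local frequency, mimicking the
# experimentally observed effect of the repellent stimulus.
omega = [OMEGA * (0.7 if i < 5 else 1.0) for i in range(N)]
theta = [0.0] * N           # start from the globally synchronised state

for step in range(STEPS):
    dtheta = []
    for i in range(N):
        coupling = 0.0
        for j in (i - 1, i + 1):          # nearest-neighbour coupling
            if 0 <= j < N:
                coupling += K * math.sin(theta[j] - theta[i])
        dtheta.append(omega[i] + coupling)
    theta = [t + DT * d for t, d in zip(theta, dtheta)]

# Effective frequencies (unwrapped phase / elapsed time): the lit end
# runs slower and has desynchronised from the rest of the chain.
t_total = STEPS * DT
print(f"lit end:   {theta[0]  / t_total / (2 * math.pi):.3f} Hz")
print(f"unlit end: {theta[-1] / t_total / (2 * math.pi):.3f} Hz")
```

Running the sketch shows the stimulated end settling at a lower effective frequency and accumulating a phase lag, a crude analogue of the local desynchronisation that, in the real plasmodium, propagates through protoplasmic streaming and eventually redirects the whole cell.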